Semi-supervised learning : Wikipedia English edition
Semi-supervised learning

Semi-supervised learning is a class of supervised learning tasks and techniques that also make use of unlabeled data for training: typically a small amount of labeled data together with a large amount of unlabeled data. Semi-supervised learning falls between unsupervised learning (without any labeled training data) and supervised learning (with completely labeled training data). Many machine-learning researchers have found that unlabeled data, when used in conjunction with a small amount of labeled data, can produce considerable improvement in learning accuracy. The acquisition of labeled data for a learning problem often requires a skilled human agent (e.g. to transcribe an audio segment) or a physical experiment (e.g. determining the 3D structure of a protein or determining whether there is oil at a particular location). The cost associated with the labeling process thus may render a fully labeled training set infeasible, whereas acquisition of unlabeled data is relatively inexpensive. In such situations, semi-supervised learning can be of great practical value. Semi-supervised learning is also of theoretical interest in machine learning and as a model for human learning.
As in the supervised learning framework, we are given a set of l independently and identically distributed examples x_1,\dots,x_l \in X with corresponding labels y_1,\dots,y_l \in Y. Additionally, we are given u unlabeled examples x_{l+1},\dots,x_{l+u} \in X. Semi-supervised learning attempts to make use of this combined information to surpass the classification performance that could be obtained either by discarding the unlabeled data and doing supervised learning or by discarding the labels and doing unsupervised learning.
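One simple way to exploit this combined information is self-training: a classifier fit on the labeled examples labels its most confident unlabeled examples, which are then added to the training pool. The sketch below is purely illustrative — the toy 2-D dataset and the 1-nearest-neighbour base classifier are invented for the example; practical systems use stronger base learners and explicit confidence scores.

```python
import math

# Toy 2-D dataset: l = 4 labeled points and u = 4 unlabeled points
# (variable names mirror the l / u notation in the text above).
labeled = [((0.0, 0.0), 0), ((0.2, 0.1), 0),
           ((3.0, 3.0), 1), ((3.1, 2.9), 1)]
unlabeled = [(0.1, 0.2), (0.3, 0.0), (2.9, 3.1), (3.2, 3.0)]

def nearest_label(point, pool):
    """1-nearest-neighbour prediction from the current labeled pool."""
    return min(pool, key=lambda e: math.dist(point, e[0]))[1]

# Self-training loop: repeatedly take the unlabeled point closest to the
# labeled pool (a crude stand-in for "most confident"), predict its
# label, and add it to the pool so it can help label the rest.
pool = list(labeled)
remaining = list(unlabeled)
while remaining:
    x = min(remaining, key=lambda p: min(math.dist(p, e[0]) for e in pool))
    remaining.remove(x)
    pool.append((x, nearest_label(x, pool)))

inferred = {x: y for x, y in pool[len(labeled):]}
print(inferred)
```

Here each newly labeled point immediately influences later predictions, which is why self-training can amplify early mistakes on real data.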
Semi-supervised learning may refer to either transductive learning or inductive learning. The goal of transductive learning is to infer the correct labels for the given unlabeled data x_{l+1},\dots,x_{l+u} only. The goal of inductive learning is to infer the correct mapping from X to Y.
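The distinction can be made concrete in code. In this toy sketch (the 1-D data and the midpoint-threshold rule are invented for illustration), the inductive output is a rule defined over all of X, while the transductive output is only the labels for the given unlabeled points:

```python
# Labeled 1-D data and the specific unlabeled points we care about.
labeled_x = [0.0, 1.0, 4.0, 5.0]
labeled_y = [0, 0, 1, 1]
unlabeled_x = [0.5, 4.5]

# Inductive learning: infer a mapping from X to Y — here, a threshold
# halfway between the two class means — usable on ANY future input.
mean0 = sum(x for x, y in zip(labeled_x, labeled_y) if y == 0) / 2
mean1 = sum(x for x, y in zip(labeled_x, labeled_y) if y == 1) / 2
threshold = (mean0 + mean1) / 2

def classify(x):
    """The inductive output: a classification rule over all of X."""
    return 0 if x < threshold else 1

# Transductive learning: only labels for the GIVEN unlabeled points.
transductive_labels = [classify(x) for x in unlabeled_x]
print(transductive_labels)
```

Note that this example derives the transductive answer from a full inductive rule, which is exactly what Vapnik's principle (discussed below) says is more than the transductive problem requires.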
Intuitively, we can think of the learning problem as an exam and labeled data as the few example problems that the teacher solved in class. The teacher also provides a set of unsolved problems. In the transductive setting, these unsolved problems are a take-home exam and you want to do well on them in particular. In the inductive setting, these are practice problems of the sort you will encounter on the in-class exam.
It is unnecessary (and, according to Vapnik's principle, imprudent) to perform transductive learning by way of inferring a classification rule over the entire input space; however, in practice, algorithms formally designed for transduction or induction are often used interchangeably.
==Assumptions used in semi-supervised learning==

In order to make any use of unlabeled data, we must assume some structure in the underlying distribution of the data. Semi-supervised learning algorithms make use of at least one of the following assumptions: the smoothness assumption (points that are close to each other are likely to share a label), the cluster assumption (the data tend to form discrete clusters, and points in the same cluster are likely to share a label), or the manifold assumption (the data lie approximately on a manifold of much lower dimension than the input space).
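For instance, under the commonly used cluster assumption — that points in the same cluster tend to share a label — a single labeled example per cluster can propagate to everything else in that cluster. A toy sketch, with an invented 1-D dataset and a naive largest-gap clustering standing in for a real clustering algorithm:

```python
# Six points, only two of which carry labels (one per cluster).
points = [0.1, 0.2, 0.3, 5.0, 5.1, 5.2]
labels = {0.1: "a", 5.2: "b"}

# Split into two clusters at the largest gap between sorted points
# (a deliberately naive clustering, chosen to keep the sketch short).
pts = sorted(points)
gaps = [(pts[i + 1] - pts[i], i) for i in range(len(pts) - 1)]
split = max(gaps)[1]
clusters = [pts[:split + 1], pts[split + 1:]]

# Propagate each cluster's single known label to all of its members.
propagated = {}
for cluster in clusters:
    label = next(labels[p] for p in cluster if p in labels)
    propagated.update({p: label for p in cluster})
print(propagated)
```

The unlabeled points do the real work here: they reveal the cluster structure that lets two labels cover six points.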


Excerpt source: the free encyclopedia Wikipedia
Read the full text of "Semi-supervised learning" on Wikipedia
Copyright(C) kotoba.ne.jp 1997-2016. All Rights Reserved.